
    Convergence of wavelet thresholding estimators of differential operators

    Wavelet shrinkage is a strategy for obtaining a nonlinear approximation to a given signal. The method is applied in many areas, including data compression, signal processing, and statistics. The almost everywhere convergence of the resulting wavelet series was established in [T. Tao, On the almost everywhere convergence of wavelet summation methods, Appl. Comput. Harmon. Anal. 3 (1996) 384–387] and [T. Tao, B. Vidakovic, Almost everywhere behavior of general wavelet shrinkage operators, Appl. Comput. Harmon. Anal. 9 (2000) 72–82]. Using a representation of f′ in terms of the wavelet coefficients of f, we study how applying wavelet thresholding to f affects its derivative f′. In this paper, for the representation of differential operators in nonstandard form, we establish the almost everywhere convergence of the estimators as the threshold tends to zero.
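The thresholding operation the abstract refers to can be illustrated with a minimal sketch of soft shrinkage applied to a sequence of wavelet coefficients; the threshold value and the toy coefficients below are illustrative, not taken from the paper.

```python
def soft_threshold(c, t):
    """Shrink a coefficient c toward zero by t (soft thresholding)."""
    if c > t:
        return c - t
    if c < -t:
        return c + t
    return 0.0

# Toy coefficient sequence: small coefficients are set to zero,
# large ones are shrunk toward zero by t.
coeffs = [2.5, -0.3, 0.1, -1.2, 0.05]
t = 0.5
shrunk = [soft_threshold(c, t) for c in coeffs]
# As t -> 0 the shrunk coefficients converge to the originals, which is
# the regime in which the paper's a.e. convergence result is stated.
```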

    Consistency of regularized spectral clustering

    Clustering is a widely used technique in machine learning; however, relatively little research on the consistency of clustering algorithms has been done so far. In this paper we investigate the consistency of the recently proposed regularized spectral clustering algorithm, which provides a natural out-of-sample extension of spectral clustering. The presence of the regularization term distinguishes our setting from that of previous work. Our approach is mainly an elaborate analysis of a functional called the clustering objective. Moreover, we establish a convergence rate, which depends on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. Some new methods are exploited in the analysis, since the underlying setting is much more complicated than usual.
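For orientation, here is a toy sketch of the classical spectral bipartition that the paper's regularized, out-of-sample variant builds on (not the paper's algorithm itself); the data and the Gaussian kernel width are illustrative.

```python
import numpy as np

# Two obvious 1-D clusters.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
W = np.exp(-(x[:, None] - x[None, :]) ** 2)        # Gaussian affinity matrix
d = W.sum(axis=1)                                  # degrees
L = np.eye(len(x)) - W / np.sqrt(np.outer(d, d))   # normalized Laplacian
vals, vecs = np.linalg.eigh(L)                     # eigenvalues ascending
labels = (vecs[:, 1] > 0).astype(int)              # sign of the Fiedler vector
# The sign pattern of the second eigenvector separates the two clusters.
```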

    Convergence of subdivision schemes and smoothness of limit functions

    Starting with a vector λ = (λ(k))_{k∈Z} ∈ ℓ^p(Z), the subdivision scheme generates a sequence {S_a^n λ}_{n=1}^∞ of vectors by the subdivision operator (S_a λ)(k) = Σ_{j∈Z} λ(j) a(k − 2j), k ∈ Z. Subdivision schemes play an important role in computer graphics and wavelet analysis. It is of great interest to understand under what conditions the sequence {S_a^n λ}_{n=1}^∞ converges to an L_p-function in an appropriate sense; this problem has been studied extensively. In this paper, we consider the convergence of subdivision schemes in Sobolev spaces with the tool of the joint spectral radius. First, we give conditions under which the sequence {S_a^n λ}_{n=1}^∞ converges to a W_p^k-function in an appropriate sense. Then, we show that the subdivision scheme converges for any initial vector in W_p^k(R) provided that it does for one nonzero vector in that space. Moreover, if the shifts of the refinable function are stable, the smoothness of the limit function corresponding to the vector λ is also independent of λ, where the smoothness of a given function is measured in the generalized Lipschitz space.
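The subdivision operator (S_a λ)(k) = Σ_j λ(j) a(k − 2j) is easy to compute for finitely supported sequences; here is a minimal sketch, with the mask and data stored as dicts over their supports (the mask chosen below is the linear B-spline mask, a standard example, not taken from the paper).

```python
def subdivide(lam, a):
    """Apply the subdivision operator (S_a lam)(k) = sum_j lam(j) a(k - 2j)."""
    out = {}
    for j, lj in lam.items():
        for m, am in a.items():        # m plays the role of k - 2j
            k = m + 2 * j
            out[k] = out.get(k, 0.0) + lj * am
    return out

# Mask of the linear B-spline scheme; iterating S_a on the delta
# sequence converges to the hat function.
a = {-1: 0.5, 0: 1.0, 1: 0.5}
lam = {0: 1.0}
lam1 = subdivide(lam, a)   # one refinement step
```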

    Approximation of Nonlinear Functionals Using Deep ReLU Networks

    In recent years, functional neural networks have been proposed and studied in order to approximate nonlinear continuous functionals defined on L^p([−1, 1]^s) for integers s ≥ 1 and 1 ≤ p < ∞. However, their theoretical properties are largely unknown beyond universality of approximation, or the existing analysis does not apply to the rectified linear unit (ReLU) activation function. To fill this void, we investigate the approximation power of functional deep neural networks with the ReLU activation function by constructing a continuous piecewise linear interpolation under a simple triangulation. In addition, we establish rates of approximation for the proposed functional deep ReLU networks under mild regularity conditions. Finally, our study may also shed some light on the understanding of functional data learning algorithms.
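The basic idea of a functional network, as described above, is that the input function is first discretized and the samples are then fed to an ordinary deep ReLU network. A hedged, minimal sketch (architecture, weights, and the target functional are illustrative only, not the paper's construction):

```python
def relu(x):
    return x if x > 0.0 else 0.0

def mlp(x, layers):
    """Evaluate a fully connected ReLU network; layers = [(W, b), ...]."""
    for i, (W, b) in enumerate(layers):
        y = [sum(w * xi for w, xi in zip(row, x)) + bi
             for row, bi in zip(W, b)]
        x = y if i == len(layers) - 1 else [relu(v) for v in y]
    return x

# Discretize f(t) = t^2 at three grid points in [-1, 1] and apply a toy
# one-hidden-layer net that averages the samples, approximating the
# functional F(f) = mean of f over the grid.
grid = [-1.0, 0.0, 1.0]
samples = [t * t for t in grid]
third = 1.0 / 3.0
layers = [([[third, third, third]], [0.0]),   # hidden: average of samples
          ([[1.0]], [0.0])]                   # output: identity
out = mlp(samples, layers)[0]                 # approximately 2/3
```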

    A Simpler Approach to Coefficient Regularized Support Vector Machines Regression

    We consider a class of support vector machine regression (SVMR) algorithms associated with ℓ^q (1 ≤ q < ∞) coefficient-based regularization and a data-dependent hypothesis space. Compared with the previous literature, we provide a simpler convergence analysis for these algorithms. The novelty of our analysis lies in the estimation of the hypothesis error, which is carried out by setting a stepping stone between the coefficient-regularized SVMR and the classical SVMR. An explicit learning rate is then derived under very mild conditions.
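For orientation, a generic form of the coefficient-based objective described above can be written as follows (a paraphrase for the reader, not the paper's exact display; here K is the kernel, {(x_i, y_i)}_{i=1}^m the sample, and ψ the ε-insensitive loss used in SVMR):

```latex
f_{\alpha} = \sum_{i=1}^{m} \alpha_i K(x_i,\cdot), \qquad
\alpha^{\mathbf{z}} = \operatorname*{arg\,min}_{\alpha\in\mathbb{R}^{m}}
\left\{ \frac{1}{m}\sum_{i=1}^{m} \psi\bigl(f_{\alpha}(x_i)-y_i\bigr)
+ \lambda \sum_{i=1}^{m} \lvert\alpha_i\rvert^{q} \right\},
\qquad \psi(t)=\max\{\lvert t\rvert-\varepsilon,\,0\}
```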

    Hawking radiation of Dirac particles via tunneling from Kerr black hole

    We investigate Dirac particles' Hawking radiation from the event horizon of a Kerr black hole in terms of the tunneling formalism. Applying the WKB approximation to the generally covariant Dirac equation in the Kerr spacetime background, we obtain the tunneling probability for fermions and the Hawking temperature of the Kerr black hole. The result obtained by taking fermion tunneling into account is consistent with the previous literature. Comment: 7 pages, no figures, to appear in CQ
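For reference, the Hawking temperature of the Kerr black hole that such tunneling computations recover is the standard surface-gravity result (in units G = c = ħ = k_B = 1, with M the mass and a the spin parameter):

```latex
T_H = \frac{\kappa}{2\pi}
    = \frac{r_+ - r_-}{4\pi\left(r_+^2 + a^2\right)},
\qquad r_\pm = M \pm \sqrt{M^2 - a^2}
```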

    Fermions Tunneling from Apparent Horizon of FRW Universe

    In the paper [arXiv:0809.1554], the Hawking radiation of scalar particles from the apparent horizon of the Friedmann–Robertson–Walker (FRW) universe was investigated using the tunneling formalism. The authors obtained the Hawking temperature associated with the apparent horizon, which has been extensively applied in investigating the relationship between the first law of thermodynamics and the Friedmann equations. In this paper, we calculate fermions' Hawking radiation from the apparent horizon of the FRW universe via the tunneling formalism. Applying the WKB approximation to the generally covariant Dirac equation in the FRW spacetime background, the radiation spectrum and Hawking temperature of the apparent horizon are correctly recovered, which supports the arguments presented in [arXiv:0809.1554]. Comment: 8 pages, no figure
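The apparent-horizon temperature recovered by such calculations takes a standard form (with H the Hubble parameter, k the spatial curvature, a the scale factor, and r̃_A the apparent-horizon radius):

```latex
T = \frac{1}{2\pi \tilde{r}_A},
\qquad \tilde{r}_A = \frac{1}{\sqrt{H^2 + k/a^2}}
```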

    The effect of epidural analgesia on maternal-neonatal outcomes: a retrospective study

    Objectives: Epidural analgesia is commonly used to relieve labor pain in contemporary clinical practice. The proportion of pregnant women who request epidural analgesia during labor has been increasing annually, leading to debate about the effect of epidural analgesia on maternal and neonatal outcomes.
    Material and methods: The medical records of nulliparous women with a term singleton pregnancy from January to December 2019 at the Affiliated Hospital of Zunyi Medical University were retrospectively reviewed. The women were divided into those who received epidural analgesia during delivery and those who did not. Maternal and neonatal outcomes were assessed.
    Results: A total of 528 women met the inclusion criteria. The overall labor analgesia rate was 43.0% (227/528). Women with epidural analgesia had a significantly longer second stage of labor [34.5 (22.8–65.3) vs 27.0 (18.0–41.3) min, p < 0.001] and total duration of labor [698.5 (493.5–875.0) vs 489.5 (344.0–676.3) min, p < 0.001] compared with those without epidural analgesia. There were no significant associations between epidural analgesia and the normal vaginal delivery rate, the incidence of episiotomy, or other adverse maternal or neonatal outcomes (p > 0.05).
    Conclusions: Epidural analgesia can prolong the second stage of labor, but it poses no increased risk to either mother or neonate.